
    Towards Real-time Mixed Reality Matting In Natural Scenes

    In Mixed Reality scenarios, background replacement is a common way to immerse a user in a synthetic environment. Properly identifying the background pixels in an image or video is a difficult problem known as matting. Proper alpha mattes usually come from human guidance, special hardware setups, or color-dependent algorithms. This is a consequence of the under-constrained nature of the per-pixel alpha blending equation. In constant color matting, research identifies and replaces a background that is a single color, known as the chroma key color. Unfortunately, these algorithms force a controlled physical environment and favor constant, uniform lighting. More generic approaches, such as natural image matting, have made progress finding alpha matte solutions in environments with naturally occurring backgrounds. However, even for the quicker algorithms, the generation of trimaps, indicating regions of known foreground and background pixels, normally requires human interaction or offline computation.
    This research addresses ways to automatically solve an alpha matte for an image, and by extension a video, in real time using a consumer-level GPU. It does so even in the context of noisy environments that provide less reliable constraints than controlled settings. To attack these challenges, we are particularly interested in automatically generating trimaps from depth buffers for dynamic scenes so that algorithms requiring denser constraints may be used. The resulting computation is parallelizable so that it may run on a GPU, and it should work for natural images as well as chroma key backgrounds. Extra input may be required, but when this occurs, commodity hardware available in most Mixed Reality setups should be able to provide it. This allows us to provide real-time alpha mattes for Mixed Reality scenarios that take place in relatively controlled environments. As a consequence, while monochromatic backdrops (such as green screens or retro-reflective material) aid the algorithm's accuracy, they are not an explicit requirement.
    Finally, we explore a sub-image based approach to parallelize an existing hierarchical approach on high resolution imagery. We show that locality can be exploited to significantly reduce the memory and compute requirements previously necessary when computing alpha mattes of high resolution images. We achieve this using a parallelizable scheme that is independent of both the matting algorithm and image features. Combined, these research topics provide a basis for Mixed Reality scenarios using real-time natural image matting on high definition video sources.
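    For reference, the per-pixel alpha blending (compositing) equation referred to above is conventionally written as below; this is the standard textbook formulation rather than notation taken from the dissertation itself.

```latex
% Observed pixel color C_p is a convex blend of an unknown foreground color F_p
% and an unknown background color B_p, weighted by an unknown opacity alpha_p.
% With RGB channels this yields 3 equations in 7 unknowns per pixel, which is why
% matting is under-constrained without extra guidance such as a trimap.
C_p = \alpha_p F_p + (1 - \alpha_p) B_p, \qquad \alpha_p \in [0, 1]
```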

    Real-Time Chromakey Matting Using Image Statistics

    Given a video signal, we generate an alpha matte based on the chromakey information. The computation is done in interactive time using pixel shaders. To accomplish this, we use Principal Components Analysis to generate a linear transformation matrix in which the resulting color triplet's Euclidean distance is directly related to the probability that the color exists in the chromakey spectrum. The result of this process is a trimap of the video signal's opacity. To solve the alpha matte from the trimap, we minimize an energy function constrained by the trimap with gradient descent. This energy function is based on the least-squared error of overlapping neighborhoods around each pixel and is independent of the background or foreground color.
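    The following sketch illustrates the general idea of a PCA-derived chroma distance and trimap; it is a minimal CPU approximation of this kind of pipeline, and all function names and threshold values are illustrative assumptions rather than the authors' implementation.

```python
# Minimal CPU sketch (not the paper's GPU implementation): learn a PCA-based
# "chroma distance" from sample pixels of the keyed background, then threshold
# it into a trimap. Function names and thresholds are illustrative assumptions.
import numpy as np

def fit_chroma_transform(key_samples):
    """key_samples: (N, 3) RGB pixels sampled from the chroma-key background."""
    key_samples = np.asarray(key_samples, dtype=np.float64)
    mean = key_samples.mean(axis=0)
    cov = np.cov(key_samples - mean, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Whitening transform: Euclidean distance in the transformed space behaves
    # like a Mahalanobis distance, i.e. how well a color fits the key spectrum.
    transform = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + 1e-8)) @ eigvecs.T
    return mean, transform

def trimap_from_key(frame, mean, transform, bg_thresh=2.0, fg_thresh=5.0):
    """frame: (H, W, 3) RGB image. Returns 0=background, 1=unknown, 2=foreground."""
    d = np.linalg.norm((frame.reshape(-1, 3) - mean) @ transform.T, axis=1)
    d = d.reshape(frame.shape[:2])
    trimap = np.ones(d.shape, dtype=np.uint8)  # unknown by default
    trimap[d < bg_thresh] = 0                  # close to the key color: background
    trimap[d > fg_thresh] = 2                  # far from the key color: foreground
    return trimap
```

    In practice the transform would be computed once from background samples and applied per frame in a pixel shader; the two thresholds simply separate pixels that are clearly key-colored, clearly foreground, or left unknown for the matting stage.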

    Picbreeder: A Case Study in Collaborative Evolutionary Exploration of Design Space

    For domains in which fitness is subjective or difficult to express formally, interactive evolutionary computation (IEC) is a natural choice. It is possible that a collaborative process combining feedback from multiple users can improve the quality and quantity of generated artifacts. Picbreeder, a large-scale online experiment in collaborative interactive evolution (CIE), explores this potential. Picbreeder is an online community in which users can evolve and share images, and most importantly, continue evolving others' images. Through this process of branching from other images, and through continually increasing image complexity made possible by the underlying neuroevolution of augmenting topologies (NEAT) algorithm, evolved images proliferate unlike in any other current IEC system. This paper discusses not only the strengths of the Picbreeder approach, but its challenges and shortcomings as well, in the hope that lessons learned will inform the design of future CIE systems.

    Obeticholic acid for the treatment of non-alcoholic steatohepatitis: interim analysis from a multicentre, randomised, placebo-controlled phase 3 trial

    Background Non-alcoholic steatohepatitis (NASH) is a common type of chronic liver disease that can lead to cirrhosis. Obeticholic acid, a farnesoid X receptor agonist, has been shown to improve the histological features of NASH. Here we report results from a planned interim analysis of an ongoing, phase 3 study of obeticholic acid for NASH.
    Methods In this multicentre, randomised, double-blind, placebo-controlled study, adult patients with definite NASH, non-alcoholic fatty liver disease (NAFLD) activity score of at least 4, and fibrosis stages F2–F3, or F1 with at least one accompanying comorbidity, were randomly assigned using an interactive web response system in a 1:1:1 ratio to receive oral placebo, obeticholic acid 10 mg, or obeticholic acid 25 mg daily. Patients were excluded if cirrhosis, other chronic liver disease, elevated alcohol consumption, or confounding conditions were present. The primary endpoints for the month-18 interim analysis were fibrosis improvement (≥1 stage) with no worsening of NASH, or NASH resolution with no worsening of fibrosis, with the study considered successful if either primary endpoint was met. Primary analyses were done by intention to treat, in patients with fibrosis stage F2–F3 who received at least one dose of treatment and reached, or would have reached, the month-18 visit by the prespecified interim analysis cutoff date. The study also evaluated other histological and biochemical markers of NASH and fibrosis, and safety. This study is ongoing, and registered with ClinicalTrials.gov, NCT02548351, and EudraCT, 20150-025601-6.
    Findings Between Dec 9, 2015, and Oct 26, 2018, 1968 patients with stage F1–F3 fibrosis were enrolled and received at least one dose of study treatment; 931 patients with stage F2–F3 fibrosis were included in the primary analysis (311 in the placebo group, 312 in the obeticholic acid 10 mg group, and 308 in the obeticholic acid 25 mg group). The fibrosis improvement endpoint was achieved by 37 (12%) patients in the placebo group, 55 (18%) in the obeticholic acid 10 mg group (p=0·045), and 71 (23%) in the obeticholic acid 25 mg group (p=0·0002). The NASH resolution endpoint was not met (25 [8%] patients in the placebo group, 35 [11%] in the obeticholic acid 10 mg group [p=0·18], and 36 [12%] in the obeticholic acid 25 mg group [p=0·13]). In the safety population (1968 patients with fibrosis stages F1–F3), the most common adverse event was pruritus (123 [19%] in the placebo group, 183 [28%] in the obeticholic acid 10 mg group, and 336 [51%] in the obeticholic acid 25 mg group), which was generally mild to moderate in severity. The overall safety profile was similar to that in previous studies, and the incidence of serious adverse events was similar across treatment groups (75 [11%] patients in the placebo group, 72 [11%] in the obeticholic acid 10 mg group, and 93 [14%] in the obeticholic acid 25 mg group).
    Interpretation Obeticholic acid 25 mg significantly improved fibrosis and key components of NASH disease activity among patients with NASH. The results from this planned interim analysis show clinically significant histological improvement that is reasonably likely to predict clinical benefit. This study is ongoing to assess clinical outcomes.

    Real-Time Video Matting For Mixed Reality Using Depth Generated Trimaps

    In Mixed Reality scenarios, background replacement is a common way to immerse a user in a synthetic environment. Properly identifying the background pixels in an image or video is a difficult problem known as matting. Proper alpha mattes usually come from human guidance, special hardware setups, or color dependent algorithms. This is a consequence of the under-constrained nature of the per pixel alpha blending equation. While the field of natural image matting has made progress finding a least squares solution for an alpha matte, the generation of trimaps, indicating regions of known foreground and background pixels, normally requires human interaction or offline computation. We overcome these limitations by combining a low fidelity depth image that segments the original video signal with a real-time parallel natural image matting technique that favors objects with similar colors in the foreground and background. This allows us to provide real-time alpha mattes for Mixed Reality scenarios that take place in relatively controlled environments. As a consequence, while monochromatic backdrops (such as green screens or retro-reflective material) aid the algorithm's accuracy, they are not an explicit requirement.
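    A minimal sketch of a depth-to-trimap step of the kind described above is shown below; it assumes a depth buffer registered to the color image and uses simple morphology to carve out the unknown band. The threshold, band width, and function names are illustrative assumptions, not values taken from the paper.

```python
# Sketch of an assumed depth-to-trimap step (not the paper's code): threshold a
# registered depth buffer, then erode/dilate the silhouette to leave an "unknown"
# band for the natural image matting stage. Thresholds and names are illustrative.
import numpy as np
from scipy.ndimage import binary_dilation, binary_erosion

def trimap_from_depth(depth, fg_max_depth=1.5, band=5):
    """depth: (H, W) depth in metres. Returns 0=background, 1=unknown, 2=foreground."""
    fg = depth < fg_max_depth                        # pixels closer than the cutoff
    sure_fg = binary_erosion(fg, iterations=band)    # shrink: pixels trusted as foreground
    sure_bg = ~binary_dilation(fg, iterations=band)  # grow then invert: trusted background
    trimap = np.ones(depth.shape, dtype=np.uint8)    # everything else stays unknown
    trimap[sure_bg] = 0
    trimap[sure_fg] = 2
    return trimap
```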

    Picbreeder: Collaborative Interactive Evolution of Images

    Picbreeder [1] is a new website that is open to the public for collaborative interactive evolution of images. A unique feature of Picbreeder is that users can continue evolving other users' images by branching. The continual process of evolving and branching means that images can continue to improve and increase in complexity indefinitely, yielding a proliferation of artistic novelty that requires no explicit artistic talent to produce.
    Interactive Evolutionary Computation
    Picbreeder borrows ideas from Evolutionary Computation (EC), which allows computers to produce a myriad of digital artifacts, from circuit designs to neural networks, by emulating the process of natural selection. In EC, a population of individuals is evaluated for fitness and mutated and/or mated to produce the next generation. This cycle continues until evolution produces an individual considered significant. Interactive Evolutionary Computation (IEC), originally explored by Dawkins [2], is a type of EC in which a human evaluates the fitness of individuals (see Takagi [3] for a review). This technique is particularly effective at evolving artifacts that are too subjective for the computer to evaluate itself, including artwork, music, and designs. Picbreeder uses a specialized evolutionary algorithm called Compositional Pattern Producing Network NEAT (CPPN-NEAT).
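    The basic IEC loop described above can be sketched as follows; this is a generic toy version with a plain parameter-vector genome, not Picbreeder's CPPN-NEAT implementation, and the render and ask_user callbacks are assumed placeholders for displaying images and collecting the user's selections.

```python
# Toy sketch of a generic interactive-evolution loop (not Picbreeder's CPPN-NEAT):
# the genome is a plain parameter vector and "fitness" is whatever the user picks.
# render() and ask_user() are assumed placeholder callbacks supplied by the caller.
import random

def mutate(genome, rate=0.1):
    return [g + random.gauss(0, rate) for g in genome]

def interactive_evolution(render, ask_user, pop_size=12, genome_len=16, generations=20):
    population = [[random.uniform(-1, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        images = [render(g) for g in population]   # show candidates to the user
        chosen = ask_user(images)                  # indices of the images the user likes
        if not chosen:                             # nothing selected: show the same set again
            continue
        parents = [population[i] for i in chosen]
        # Next generation: keep the chosen parents, fill the rest with mutated offspring.
        population = parents + [mutate(random.choice(parents))
                                for _ in range(pop_size - len(parents))]
    return population
```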

    Interactive Chroma Keying For Mixed Reality

    In Mixed Reality (MR) applications, immersion of virtual objects in captured video contributes to the perceived unification of two worlds, one real, one synthetic. Since virtual actors and surroundings may appear both closer and farther than real objects, compositing must consider spatial relationships in the resulting world. Chroma keying, often called blue screening or green screening, is one common solution to this problem. This problem is under-constrained and most commonly addressed through a combination of environment preparation and commercial products. In interactive MR domains that impose restrictions on the video camera hardware, such as in experiences using video see-through (VST) head-mounted displays (HMDs), chroma keying becomes even more difficult due to the relatively low camera quality, the use of multiple camera sources (one per eye), and the required processing speed. Dealing with these constraints requires a fast and affordable solution. In our approach, we precondition the chroma key by using principal component analysis (PCA) to obtain usable alpha mattes from video streams in real time on commodity graphics processing units (GPUs). In addition, we demonstrate how our method compares to off-line commercial keying tools and how it performs with respect to signal noise within the video stream. © 2009 John Wiley & Sons, Ltd.
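    As a rough illustration of the per-pixel work such a GPU pass performs, the sketch below maps a PCA key-space distance to a soft alpha with a smoothstep; it assumes a mean and transform precomputed from key-color samples (as in the chromakey sketch earlier in this listing) and is not the authors' shader code.

```python
# Illustrative per-frame mapping from PCA key-space distance to a soft alpha matte,
# roughly what a per-pixel GPU pass would compute; mean/transform are assumed to come
# from a PCA fit over key-color samples. Thresholds and names are not from the paper.
import numpy as np

def soft_alpha(frame, mean, transform, lo=2.0, hi=5.0):
    """frame: (H, W, 3) RGB image. Returns an (H, W) alpha matte in [0, 1]."""
    d = np.linalg.norm((frame.reshape(-1, 3) - mean) @ transform.T, axis=1)
    t = np.clip((d - lo) / (hi - lo), 0.0, 1.0)  # 0 near the key color, 1 far from it
    alpha = t * t * (3.0 - 2.0 * t)              # smoothstep softens the matte edge
    return alpha.reshape(frame.shape[:2])
```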

    Picbreeder: Evolving Pictures Collaboratively Online

    Picbreeder is an online service that allows users to collaboratively evolve images. Like in other Interactive Evolutionary Computation (IEC) programs, users evolve images in Picbreeder by selecting ones that appeal to them to produce a new generation. However, Picbreeder also offers an online community in which to share these images, and most importantly, the ability to continue evolving others' images. Through this process of branching from other images, and through continually increasing image complexity made possible by the NeuroEvolution of Augmenting Topologies (NEAT) algorithm, evolved images proliferate unlike in any other current IEC system. Picbreeder enables all users, regardless of talent, to participate in a creative, exploratory process. This paper details how Picbreeder encourages innovation, featuring images that were collaboratively evolved. Copyright 2008 ACM.